Algorithms for Nonnegative Matrix Factorization with the Kullback–Leibler Divergence

Authors

Abstract

Nonnegative matrix factorization (NMF) is a standard linear dimensionality reduction technique for nonnegative data sets. In order to measure the discrepancy between the input data and the low-rank approximation, the Kullback–Leibler (KL) divergence is one of the most widely used objective functions for NMF. It corresponds to the maximum likelihood estimator when the underlying statistics of the observed data sample follow a Poisson distribution, and KL NMF is particularly meaningful for count data sets, such as documents or images. In this paper, we first collect important properties of the KL objective function that are essential to study the convergence of KL NMF algorithms. Second, together with reviewing existing algorithms for solving KL NMF, we propose three new algorithms that guarantee the non-increasingness of the objective function. We also provide global convergence guarantees for our proposed algorithms. Finally, we conduct extensive numerical experiments to provide a comprehensive picture of the performances of the KL NMF algorithms.
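The paper's three proposed algorithms are not reproduced on this page, but the classical multiplicative-update baseline for KL NMF that such work reviews and builds on can be sketched as follows (a minimal NumPy sketch under the stated Poisson/KL model; `kl_nmf` and `kl_div` are illustrative names, not from the paper, and the small `eps` guards against division by zero):

```python
import numpy as np

def kl_div(V, WH, eps=1e-10):
    """KL divergence D(V || WH) = sum(V * log(V / WH) - V + WH)."""
    return float(np.sum(V * np.log((V + eps) / (WH + eps)) - V + WH))

def kl_nmf(V, r, n_iter=200, eps=1e-10, seed=0):
    """Baseline multiplicative updates for KL NMF: V ≈ W @ H, W, H >= 0.

    Each update is guaranteed not to increase D(V || WH).
    """
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # H <- H * (W^T (V / WH)) / (W^T 1)
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
        # W <- W * ((V / WH) H^T) / (1 H^T)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1, keepdims=True).T + eps)
    return W, H
```

Because the same seed fixes the initialization, running more iterations can only lower (never raise) the KL objective, which is the non-increasingness property the abstract refers to.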


Related Articles

Algorithms for Nonnegative Matrix Factorization with the β-Divergence

This letter describes algorithms for nonnegative matrix factorization (NMF) with the β-divergence (β-NMF). The β-divergence is a family of cost functions parameterized by a single shape parameter β that takes the Euclidean distance, the Kullback-Leibler divergence, and the Itakura-Saito divergence as special cases (β = 2, 1, 0 respectively). The proposed algorithms are based on a surrogate auxi...
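The three special cases of the β-divergence mentioned above can be made concrete with a small helper (a sketch for illustration; `beta_divergence` is an assumed name, not from the letter):

```python
import numpy as np

def beta_divergence(x, y, beta):
    """Sum of the elementwise β-divergence d_β(x | y).

    Special cases: β = 2 gives half the squared Euclidean distance,
    β = 1 the Kullback-Leibler divergence, β = 0 the Itakura-Saito divergence.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    if beta == 1:   # Kullback-Leibler
        return float(np.sum(x * np.log(x / y) - x + y))
    if beta == 0:   # Itakura-Saito
        return float(np.sum(x / y - np.log(x / y) - 1))
    # Generic case, β not in {0, 1}
    return float(np.sum(x**beta + (beta - 1) * y**beta
                        - beta * x * y**(beta - 1)) / (beta * (beta - 1)))
```

For β = 2 the generic formula reduces term by term to (x² + y² − 2xy)/2 = (x − y)²/2, matching the Euclidean special case.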


Algorithms for nonnegative matrix factorization with the beta-divergence

This paper describes algorithms for nonnegative matrix factorization (NMF) with the β-divergence (β-NMF). The β-divergence is a family of cost functions parametrized by a single shape parameter β that takes the Euclidean distance, the Kullback-Leibler divergence and the Itakura-Saito divergence as special cases (β = 2, 1, 0 respectively). The proposed algorithms are based on a surrogate auxilia...


Nonnegative Matrix Factorization with the β-Divergence

The equivalence can be formalized as follows: for a particular c in (21), there is a corresponding δ > 0 in the optimization in (A-1). We focus on ℓ1-ARD, where f(x) = ‖x‖₁. Then the objective is concave in H. One natural way to solve (A-1) iteratively is to use an MM procedure by upper bounding the objective function with its tangent (first-order Taylor expansion) at the current iterate H. This...


Projective Nonnegative Matrix Factorization with α-Divergence

A new matrix factorization algorithm which combines two recently proposed nonnegative learning techniques is presented. Our new algorithm, α-PNMF, inherits the advantages of Projective Nonnegative Matrix Factorization (PNMF) for learning a highly orthogonal factor matrix. When the Kullback-Leibler (KL) divergence is generalized to the α-divergence, it gives our method more flexibility in approximati...


Nonnegative matrix factorization with α-divergence

Nonnegative matrix factorization (NMF) is a popular technique for pattern recognition, data analysis, and dimensionality reduction, the goal of which is to decompose a nonnegative data matrix X into a product of a basis matrix A and an encoding variable matrix S, with both A and S allowed to have only nonnegative elements. In this paper we consider Amari's α-divergence as a discrepancy measure and rigo...
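Amari's α-divergence as a discrepancy measure can be sketched as follows (an illustrative helper, not code from the paper; `alpha_divergence` is an assumed name, and the limiting cases α → 1 and α → 0 are handled explicitly):

```python
import numpy as np

def alpha_divergence(x, y, alpha):
    """Sum of the elementwise Amari α-divergence D_α(x | y).

    Limits: α → 1 recovers KL(x | y); α → 0 recovers KL(y | x);
    α = 2 gives half the Pearson chi-squared distance.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    if alpha == 1:
        return float(np.sum(x * np.log(x / y) - x + y))
    if alpha == 0:
        return float(np.sum(y * np.log(y / x) - y + x))
    return float(np.sum(x**alpha * y**(1 - alpha) - alpha * x
                        + (alpha - 1) * y) / (alpha * (alpha - 1)))
```

The divergence vanishes when x = y (the terms x^α y^(1−α) − αx + (α−1)y collapse to x − x = 0) and is positive otherwise, which is what makes it usable as an NMF objective.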



Journal

Journal Title: Journal of Scientific Computing

Year: 2021

ISSN: 1573-7691 (electronic), 0885-7474 (print)

DOI: https://doi.org/10.1007/s10915-021-01504-0